67 research outputs found

    Head in the Clouds: Floating Locomotion in Virtual Reality

    Navigating large virtual spaces within the confines of a small tracked volume while seated becomes a serious accessibility issue: users' lower seating position reduces their visibility and makes it difficult to reach for items comfortably. Hence, we propose a 'floating' accessibility technique, in which a seated VR user experiences the virtual environment from the perspective of a standing eye height. We conducted a user study comparing sitting, standing, and floating conditions and observed that the floating technique had no detrimental effect in comparison to the standing technique and had a slight benefit over the sitting technique.
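The core of the floating technique described above can be sketched as a vertical offset applied to the tracked head pose, so the seated user's virtual camera sits at a standing eye height. This is a minimal conceptual sketch; the function name and the eye-height values are illustrative assumptions, not taken from the paper.

```python
# Conceptual sketch of the 'floating' technique: offset the tracked
# seated head position so the virtual camera renders from a standing
# eye height. The heights below are assumed example values in metres.

SEATED_EYE_HEIGHT = 1.20    # assumed seated eye height (m)
STANDING_EYE_HEIGHT = 1.65  # assumed standing eye height (m)

def float_head_pose(tracked_pos):
    """Raise the tracked head position (x, y, z) by the
    seated-to-standing offset, leaving x and z untouched so the
    user's horizontal movement is preserved."""
    x, y, z = tracked_pos
    offset = STANDING_EYE_HEIGHT - SEATED_EYE_HEIGHT
    return (x, y + offset, z)

camera_pos = float_head_pose((0.0, 1.20, 0.0))
```

In a real system the offset would be applied every frame between tracking input and rendering, so head rotation and lateral movement remain fully responsive.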

    Profiling Distributed Virtual Environments by Tracing Causality

    Real-time interactive systems such as virtual environments have high performance requirements, and profiling is a key part of the optimisation process to meet them. Traditional techniques based on metadata and static analysis have difficulty following causality in asynchronous systems. In this paper we explore a new technique for such systems. Timestamped samples of the system state are recorded at instrumentation points at runtime. These are assembled into a graph, and the edges between dependent samples are recovered. This approach minimises the invasiveness of the instrumentation while retaining high accuracy. We describe how our instrumentation can be implemented natively in common environments, how its output can be processed into a graph describing causality, and how heterogeneous data sources can be incorporated to maximise the scope of the profiling. Across three case studies, we demonstrate the efficacy of this approach and how it supports a variety of metrics for comprehensively benchmarking distributed virtual environments.
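The sample-graph idea above can be illustrated with a small sketch: instrumentation points emit timestamped samples at runtime, and causal edges are recovered offline by linking samples that carry the same work-item token in timestamp order. The data model and matching rule here are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch of offline causality-graph assembly from timestamped
# samples. Each sample is (timestamp, site, token), where 'token'
# identifies a logical work item flowing through the system. Recording
# only these tuples keeps the runtime instrumentation minimal; the
# graph is assembled afterwards.

from collections import defaultdict

def build_causality_graph(samples):
    """Return a list of edges ((site_a, t_a), (site_b, t_b)) linking
    consecutive samples that share a token, i.e. successive
    observations of the same work item."""
    by_token = defaultdict(list)
    for t, site, token in samples:
        by_token[token].append((t, site))
    edges = []
    for events in by_token.values():
        events.sort()  # timestamp order within one work item
        for (t1, s1), (t2, s2) in zip(events, events[1:]):
            edges.append(((s1, t1), (s2, t2)))
    return edges

samples = [
    (0.0, "render_begin", 42),
    (0.5, "network_send", 42),
    (0.2, "render_begin", 43),
]
edges = build_causality_graph(samples)
```

A real profiler would also merge heterogeneous sources (e.g. network traces) into the same graph by mapping them onto the shared token space.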

    A novel experimental design of a real-time VR tracking device

    Virtual Reality (VR) is progressively being adopted at different stages of design and product development. Consequently, meeting the evolving interaction requirements of engineering design and development in VR is essential for technology adoption. One of these requirements is real-time positional tracking. This paper presents an experimental design of a new real-time positional tracking device (tracker) that is more compact than the existing solution, while addressing factors such as wearability and connectivity. We compare simulations of the proposed device and the existing solution, and discuss the results and limitations. The new experimental shape of the device is tailored towards research, allowing the engineering designer to take advantage of a new tracker alternative, and opens the door to new VR applications in research and product development.

    Action Sounds Modulate Arm Reaching Movements

    Our mental representations of our body are continuously updated through multisensory bodily feedback as we move and interact with our environment. Although it is often assumed that these internal models of body representation are used to successfully act upon the environment, only a few studies have actually looked at how body-representation changes influence goal-directed actions, and none have looked at this in relation to body-representation changes induced by sound. The present work examines this question for the first time. Participants reached for a target object before and after adaptation periods during which the sounds produced by their hand tapping a surface were spatially manipulated to induce a representation of an elongated arm. After adaptation, participants' reaching movements were performed in a way consistent with having a longer arm, in that their reaching velocities were reduced. These kinematic changes suggest auditory-driven recalibration of the somatosensory representation of the arm morphology. These results support the hypothesis that one's represented body size is used as a perceptual ruler to measure objects' distances and to accordingly guide bodily actions.

    An ‘In the Wild’ Experiment on Presence and Embodiment using Consumer Virtual Reality Equipment

    Consumer virtual reality systems are now becoming widely available. We report on a study on presence and embodiment within virtual reality that was conducted ‘in the wild’, in that data was collected from devices owned by consumers in uncontrolled settings, rather than in a traditional laboratory setting. Users of Samsung Gear VR and Google Cardboard devices were invited by web pages and email invitation to download and run an app that presented a scenario where the participant would sit in a bar watching a singer. Each participant saw one of eight variations of the scenario: with or without a self-avatar; singer inviting the participant to tap along or not; singer looking at the participant or not. Despite the uncontrolled situation of the experiment, results from an in-app questionnaire showed tentative evidence that a self-avatar had a positive effect on self-report of presence and embodiment, and that the singer inviting the participant to tap along had a negative effect on self-report of embodiment. We discuss the limitations of the study and the platforms, and the potential for future open virtual reality experiments.
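The eight scenario variations above arise from three binary factors, i.e. a 2×2×2 factorial design. The sketch below enumerates these conditions and shows one simple deterministic way an app might map a participant to a condition; the factor names follow the abstract, but the assignment scheme is an illustrative assumption, not the study's actual code.

```python
# Enumerate the 2 x 2 x 2 factorial conditions implied by the three
# binary factors in the abstract: self-avatar present, singer invites
# tapping, singer looks at the participant.

from itertools import product

FACTORS = [("self_avatar", (False, True)),
           ("invite_tap",  (False, True)),
           ("singer_gaze", (False, True))]

# All 8 combinations, each as a dict of factor name -> level.
CONDITIONS = [dict(zip((name for name, _ in FACTORS), combo))
              for combo in product(*(levels for _, levels in FACTORS))]

def assign_condition(participant_id):
    """Deterministically map a participant to one of the 8 conditions
    (a simple round-robin; an illustrative scheme, not the study's)."""
    return CONDITIONS[participant_id % len(CONDITIONS)]
```

Round-robin assignment balances group sizes without any server-side coordination, which matters for an uncontrolled in-the-wild deployment.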

    Immersive competence and immersive literacy: Exploring how users learn about immersive experiences

    While immersive experiences mediated through near-eye displays are still a relatively immature medium, there are millions of consumer devices in use. The level of awareness of the forms of the interface and media will vary enormously across the potential audience. Users might own personal devices or might encounter immersive systems in various venues. We introduce the term immersive competence to refer to the general practical knowledge and skills that users accumulate about how typical immersive interfaces work: the ways in which buttons are used, main locomotion techniques, and so on. We then introduce the term immersive literacy to refer to awareness of how immersive interfaces are unique, when they might be appropriate, typical forms of media, etc. We sketch out how users develop competence and literacy with immersive media, and then highlight the various open questions this raises.

    Full Body Acting Rehearsal in a Networked Virtual Environment: A Case Study

    Rehearsing a play or a movie scene generally requires the actors to be physically present in the same place at the same time. In this paper we present an example and experience of a full body motion shared virtual environment (SVE) for rehearsal. The system allows actors and directors to meet in an SVE in order to rehearse scenes for a play or a movie, that is, to perform some dialogue and blocking (positions, movements, and displacements of actors in the scene) rehearsal through a full body interactive virtual reality (VR) system. The system combines immersive VR rendering techniques and network capabilities with full body tracking. Two actors and a director rehearsed from separate locations. One actor and the director were in London (located in separate rooms) while the second actor was in Barcelona. The Barcelona actor used a wide field-of-view head-tracked head-mounted display, and wore a body suit for real-time motion capture and display. The London actor was in a Cave system, with head and partial body tracking. Each actor was presented to the other as an avatar in the shared virtual environment. The director could see the whole scenario on a desktop display and intervene by voice commands, and was also represented by a video stream displayed in a window within the virtual environment. The London participant was a professional actor, who afterward commented on the utility of the system for acting rehearsal. It was concluded that full body tracking and corresponding real-time display of all the actors' movements would be a critical requirement, and that blocking was possible down to the level of detail of gestures. Details of the implementation, actors, and director experiences are provided.

    The effect of virtual reality on visual vertigo symptoms in patients with peripheral vestibular dysfunction: a pilot study

    Individuals with vestibular dysfunction may experience visual vertigo (VV), in which symptoms are provoked or exacerbated by excessive or disorientating visual stimuli (e.g. supermarkets). VV can significantly improve when customized vestibular rehabilitation exercises are combined with exposure to optokinetic stimuli. Virtual reality (VR), which immerses patients in realistic, visually challenging environments, has also been suggested as an adjunct to vestibular rehabilitation to improve VV symptoms. This pilot study compared the responses of sixteen patients with unilateral peripheral vestibular disorder randomly allocated to a rehabilitation regime incorporating exposure to a static (Group S) or dynamic (Group D) VR environment. Participants practiced vestibular exercises, twice weekly for four weeks, inside a virtual crowded square environment presented in an immersive projection theatre (IPT), and received a vestibular exercise program to practice on the days they did not attend the clinic. A third group (Group D1) completed both the static and dynamic VR training. Treatment response was assessed with the Dynamic Gait Index and questionnaires concerning symptom triggers and psychological state. At final assessment, significant between-group differences in VV symptoms were noted between Groups D (p = 0.001) and D1 (p = 0.03) compared with Group S, with the former two showing significant improvements of 59.2% and 25.8% respectively, compared to 1.6% for the latter. Depression scores improved only for Group S (p = 0.01), while a trend towards significance was noted for Group D in anxiety scores (p = 0.07). In conclusion, exposure to dynamic VR environments should be considered a useful adjunct to vestibular rehabilitation programs for patients with peripheral vestibular disorders and VV symptoms.

    Ubiq: A System to Build Flexible Social Virtual Reality Experiences

    While they have long been a subject of academic study, social virtual reality (SVR) systems are now attracting increasingly large audiences on current consumer virtual reality systems. The design space of SVR systems is very large, and relatively little is known about how these systems should be constructed in order to be usable and efficient. In this paper we present Ubiq, a toolkit that focuses on facilitating the construction of SVR systems. We argue for the design strategy of Ubiq and its scope. Ubiq is built on the Unity platform. It provides the core functionality of many SVR systems, such as connection management, voice, and avatars, yet its design remains easy to extend. We demonstrate examples built on Ubiq and how it has been successfully used in classroom teaching. Ubiq is open source (Apache License), and thus enables several use cases that commercial systems cannot.
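Connection management, one of the core services named above, typically means matching peers into shared rooms so they can discover each other and exchange avatar and voice messages. The sketch below illustrates that general pattern only; it is NOT Ubiq's actual API (Ubiq is a C#/Unity toolkit), and all names here are illustrative assumptions.

```python
# Generic sketch of room-based connection management, the kind of core
# service an SVR toolkit provides. Illustrative only; not Ubiq's API.

class Room:
    """A shared session identified by a join code."""
    def __init__(self, join_code):
        self.join_code = join_code
        self.peers = set()

class RoomServer:
    """Matches peers into rooms by join code. Each joining peer gets
    back the list of peers already present, so it knows whom to
    connect to for avatar and voice traffic."""
    def __init__(self):
        self.rooms = {}

    def join(self, peer_id, join_code):
        room = self.rooms.setdefault(join_code, Room(join_code))
        room.peers.add(peer_id)
        return sorted(room.peers - {peer_id})  # peers already in the room

server = RoomServer()
server.join("alice", "abc123")
others = server.join("bob", "abc123")  # → ["alice"]
```

In a full system this matchmaking step would be followed by establishing per-peer transport channels over which components such as avatars and voice exchange their messages.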
